Computing the Kullback-Leibler Divergence between two Weibull Distributions

Author

  • Christian Bauckhage

Abstract

We derive a closed form solution for the Kullback-Leibler divergence between two Weibull distributions. These notes are meant as reference material and are intended to provide a guided tour towards a result that is often mentioned but seldom made explicit in the literature.

1 The Weibull Distribution

The Weibull distribution is the type III extreme value distribution; its probability density function is defined for x ∈ [0, ∞) and given by

f(x \mid k, l) = \frac{k}{l} \left( \frac{x}{l} \right)^{k-1} \exp\left[ -\left( \frac{x}{l} \right)^{k} \right] \qquad (1)

where k > 0 and l > 0 are shape and scale parameters, respectively. This is a rather flexible, unimodal density. Depending on the choice of k and l, it may be skewed to the left or to the right. For k = 1, the Weibull coincides with the Exponential distribution, and for k ≈ 3.5, it approaches the Normal distribution. An excellent account of the origins of the Weibull distribution is given in [1]. Among others, it was introduced as a plausible failure rate model [2] and has been frequently used for lifetime analysis in material or actuarial studies ever since. Extending its classical applications, it has been reported to account well for statistics of dwell times on Web sites [3], times people spend playing online games [4], and the dynamics of collective attention on the Web [5]. The Weibull also attracts interest in machine learning and pattern recognition, where it was found to represent distributions of distances among feature vectors [6], has been used in texture analysis [7,8], and was shown to provide a continuous characterization of shortest-path distributions in random networks [9]. Accordingly, methods for measuring (dis)similarities of Weibull distributions are of practical interest in data science, for they facilitate model selection and statistical inference.

2 The Kullback-Leibler Divergence

The Kullback-Leibler (KL) divergence provides a non-symmetric measure of the similarity of two probability distributions P and Q [10]. In case both distributions are continuous with densities p and q, it is defined as

D_{KL}(P \,\|\, Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx
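Since the derivation itself sits behind the full text, a numerical sanity check is useful. The sketch below (Python, with NumPy/SciPy assumed) implements the density of Eq. (1) and the closed-form divergence in the form in which it is commonly stated in the literature, with γ denoting the Euler-Mascheroni constant and Γ the gamma function; treat the exact expression as an assumption here and use the quadrature cross-check of the defining integral to validate it:

```python
import numpy as np
from scipy.special import gamma
from scipy.integrate import quad

def weibull_pdf(x, k, l):
    """Weibull density of Eq. (1): (k/l) * (x/l)**(k-1) * exp(-(x/l)**k)."""
    return (k / l) * (x / l) ** (k - 1) * np.exp(-(x / l) ** k)

def kl_weibull_closed_form(k1, l1, k2, l2):
    """Closed-form D(f1 || f2) for two Weibulls, in the form commonly
    cited in the literature (euler_gamma: Euler-Mascheroni constant)."""
    return (np.log(k1 / l1 ** k1) - np.log(k2 / l2 ** k2)
            + (k1 - k2) * (np.log(l1) - np.euler_gamma / k1)
            + (l1 / l2) ** k2 * gamma(k2 / k1 + 1.0)
            - 1.0)

def kl_weibull_numeric(k1, l1, k2, l2):
    """Direct quadrature of the defining integral, as a cross-check."""
    def integrand(x):
        p = weibull_pdf(x, k1, l1)
        q = weibull_pdf(x, k2, l2)
        return p * np.log(p / q)
    # start slightly above 0 to avoid log(0) at the boundary
    value, _ = quad(integrand, 1e-12, np.inf)
    return value

print(kl_weibull_closed_form(2.0, 1.5, 1.0, 2.0))  # ~0.357
print(kl_weibull_numeric(2.0, 1.5, 1.0, 2.0))      # should agree closely
```

As a further check, for k1 = k2 = 1 the expression collapses to log(l2/l1) + l1/l2 − 1, the familiar KL divergence between two Exponential distributions, and it vanishes when both parameter pairs coincide.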


Similar articles

Estimation of the Weibull parameters by Kullback-Leibler divergence of Survival functions

Recently, a new entropy-based divergence measure has been introduced which is much like the Kullback-Leibler divergence. This measure captures the distance between an empirical and a prescribed survival function and is much easier to compute for continuous distributions than the K-L divergence. In this paper we show that this distance converges to zero with increasing sample size, and we apply it to...

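The excerpt does not spell the measure out; a definition that circulates under the name KL divergence of survival functions (an assumption on my part, not confirmed by the text above) is D_S(F : G) = ∫₀^∞ F̄(x) log(F̄(x)/Ḡ(x)) dx − (E_F[X] − E_G[X]), where F̄ and Ḡ are survival functions. For a Weibull, F̄(x) = exp(−(x/l)^k), which is what makes the computation easy. A minimal Python sketch under that assumption, comparing an empirical survival curve against a prescribed Weibull:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as gamma_fn

def weibull_sf(x, k, l):
    """Weibull survival function: exp(-(x/l)**k)."""
    return np.exp(-(x / l) ** k)

def kl_survival(sample, k, l):
    """Assumed definition: int S_n log(S_n / S) dx - (E_n[X] - E[X]),
    with S_n the empirical and S the Weibull(k, l) survival function."""
    xs = np.sort(np.asarray(sample))
    n = len(xs)
    edges = np.concatenate(([0.0], xs))
    total = 0.0
    for i in range(n):
        s_n = 1.0 - i / n  # empirical survival on [edges[i], edges[i+1])
        piece, _ = quad(lambda x, s=s_n: s * np.log(s / weibull_sf(x, k, l)),
                        edges[i], edges[i + 1])
        total += piece
    e_weibull = l * gamma_fn(1.0 + 1.0 / k)  # Weibull mean
    return total - (xs.mean() - e_weibull)

rng = np.random.default_rng(0)
data = 1.5 * rng.weibull(2.0, size=500)  # sample from Weibull(k=2, l=1.5)
print(kl_survival(data, 2.0, 1.5))       # near 0 for the true parameters
print(kl_survival(data, 1.0, 1.5))       # larger for a wrong shape
```

Minimizing this distance over (k, l) then yields the parameter estimates the title refers to.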

Information Measures via Copula Functions

In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among these, in this paper we examine measures such as Kullback-Leibler information, J-divergence, Hellinger distance, -divergence, and so on. Properties and results related to distance between probability d...


Some Statistical Inferences on the Parameters of Records Weibull Distribution Using Entropy

In this paper, we discuss different estimators of the parameters of the Records Weibull distribution, and we also apply the Kullback-Leibler divergence of survival functions method to estimate the Records Weibull parameters. Finally, these estimators are compared using Monte Carlo simulation and good estimators are suggested.


Computing the Kullback-Leibler Divergence between two Generalized Gamma Distributions

We derive a closed form solution for the Kullback-Leibler divergence between two generalized gamma distributions. These notes are meant as a reference and provide a guided tour towards a result of practical interest that is rarely explicated in the literature.

1 The Generalized Gamma Distribution

The origins of the generalized gamma distribution can be traced back to the work of Amoroso in 1925 [1,...


WHAI: Weibull Hybrid Autoencoding Inference for Deep Topic Modeling

To train an inference network jointly with a deep generative topic model, making it both scalable to big corpora and fast in out-of-sample prediction, we develop Weibull hybrid autoencoding inference (WHAI) for deep latent Dirichlet allocation, which infers posterior samples via a hybrid of stochastic-gradient MCMC and autoencoding variational Bayes. The generative network of WHAI has a hierarc...

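The excerpt is cut off, but one property worth noting in this context (whether WHAI relies on exactly this is not confirmed by the text above) is that the Weibull has an analytic inverse CDF, so samples can be written as a deterministic function of uniform noise that is differentiable in the parameters, which is the ingredient autoencoding variational Bayes needs in order to backpropagate through sampling. A minimal sketch:

```python
import numpy as np

def sample_weibull_reparam(k, l, u):
    """Inverse-CDF (reparameterized) Weibull sampler.
    F(x) = 1 - exp(-(x/l)**k)  =>  x = l * (-log(1 - u))**(1/k),
    a deterministic map of uniform noise u that is differentiable
    in the parameters k and l."""
    return l * (-np.log1p(-u)) ** (1.0 / k)

rng = np.random.default_rng(0)
u = rng.uniform(size=100_000)
x = sample_weibull_reparam(2.0, 1.5, u)
print(x.mean())  # should approach l * Gamma(1 + 1/k) ~ 1.33
```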


Journal:
  • CoRR

Volume: abs/1310.3713

Publication date: 2013